An affine scaling method for optimization problems with polyhedral constraints

Authors

  • William W. Hager
  • Hongchao Zhang
Abstract

Recently, an affine-scaling interior-point algorithm ASL was developed for box-constrained optimization problems with a single linear constraint (Gonzalez-Lima et al., SIAM J. Optim. 21:361–390, 2011). This note extends the algorithm to handle more general polyhedral constraints. With a line search, the resulting algorithm ASP maintains the global and R-linear convergence properties of ASL. In addition, it is shown that the unit-step version of the algorithm (without line search) is locally R-linearly convergent at a nondegenerate local minimizer where the second-order sufficient optimality conditions hold. For a quadratic objective function, a sublinear convergence property is obtained without assuming either nondegeneracy or the second-order sufficient optimality conditions.


Related articles

An augmented Lagrangian affine scaling method for nonlinear programming

In this paper, we propose an Augmented Lagrangian Affine Scaling (ALAS) algorithm for general nonlinear programming, in which a quadratic approximation to the augmented Lagrangian is minimized at each iteration. Unlike classical sequential quadratic programming (SQP), the linearization of the nonlinear constraints is placed in the penalty term of this quadratic approximation, which resu...


Response surface methodology with stochastic constraints for expensive simulation

This paper investigates simulation-based optimization problems with a stochastic objective function, stochastic output constraints, and deterministic input constraints. More specifically, it generalizes classic Response Surface Methodology (RSM) to account for these stochastic and deterministic constraints. This extension is obtained through the generalization of the estimated steepest descent—...


Error Bound and Reduced-Gradient Projection Algorithms for Convex Minimization over a Polyhedral Set

Consider the problem of minimizing, over a polyhedral set, the composition of an affine mapping with a strongly convex differentiable function. The polyhedral set is expressed as the intersection of an affine set with a (simpler) polyhedral set, and a new local error bound for this problem, based on projecting the reduced gradient associated with the affine set onto the simpler polyhedral set, i...


An efficient one-layer recurrent neural network for solving a class of nonsmooth optimization problems

Constrained optimization problems have a wide range of applications in science, economics, and engineering. In this paper, a neural network model is proposed to solve a class of nonsmooth constrained optimization problems with a nonsmooth convex objective function subject to nonlinear inequality and affine equality constraints. It is a one-layer non-penalty recurrent neural network based on the...


An Affine-Scaling Interior-Point Method for Continuous Knapsack Constraints with Application to Support Vector Machines

An affine-scaling algorithm (ASL) for optimization problems with a single linear equality constraint and box restrictions is developed. The algorithm has the property that each iterate lies in the relative interior of the feasible set. The search direction is obtained by approximating the Hessian of the objective function in Newton’s method by a multiple of the identity matrix. The algorithm is...
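The step described in this abstract (Newton's method with the Hessian replaced by a multiple of the identity, every iterate kept strictly inside the feasible region) can be illustrated for the simpler box-constrained case. The sketch below is only a generic affine-scaling iteration under those assumptions, not the authors' exact ASL method; the function name, the damping factor `gamma`, and the toy problem are all illustrative choices.

```python
# Hedged sketch: a generic affine-scaling interior-point step for
# box-constrained minimization (min f(x), lo <= x <= hi).  The Hessian
# is approximated by lam * I, as in the abstract; this is NOT the
# authors' exact ASL/ASP iteration.

def affine_scaling_step(x, grad, lo, hi, lam=1.0, gamma=0.99):
    """One step: scale each gradient component by the distance to the
    bound it pushes toward, then damp the step so the next iterate
    stays strictly inside the box."""
    d = []
    for xi, gi, li, ui in zip(x, grad, lo, hi):
        scale = (xi - li) if gi > 0 else (ui - xi)
        d.append(-scale * gi / lam)
    t = 1.0  # largest step length with x + t*d strictly feasible
    for xi, di, li, ui in zip(x, d, lo, hi):
        if di > 0:
            t = min(t, gamma * (ui - xi) / di)
        elif di < 0:
            t = min(t, gamma * (li - xi) / di)
    return [xi + t * di for xi, di in zip(x, d)]

# Toy problem: f(x) = 0.5 * ||x - c||^2 over [0, 1]^2 with c outside
# the box, so the constrained minimizer sits on the boundary at (1, 0).
c = [1.5, -0.5]
x = [0.5, 0.5]
for _ in range(30):
    grad = [xi - ci for xi, ci in zip(x, c)]
    x = affine_scaling_step(x, grad, [0.0, 0.0], [1.0, 1.0])
# x approaches (1, 0) while every iterate remains strictly interior.
```

Because the diagonal scaling shrinks near an active bound, steps toward the boundary become progressively shorter, which is consistent with the sublinear behavior mentioned in the main abstract.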



Journal:
  • Comp. Opt. and Appl.

Volume 59, Issue

Pages  -

Published 2014